
MkLlm


Node for LLM-based text generation.

Example: Regular

Jinja

```jinja
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
```

Python

```python
MkLlm("Write a poem about MkDocs")
```

In the realm where knowledge thrives,
MkDocs reigns, where clarity jives.
With Markdown’s grace and ease of prose,
It builds your docs, as wisdom flows.

A simple syntax, a friend so fair,
Crafting pages with meticulous care.
From headers strong to lists that gleam,
In every line, a thoughtful dream.

Themes and styles, so rich, so bright,
Transforming texts into sheer delight.
With a command, mkdocs serve at play,
Your docs come alive in a vibrant way.

Navigation's smooth, like a gentle stream,
And search functionality fulfills the dream.
With every click, your users explore,
Finding insights, knowledge, and more.

Version control, a savior’s grace,
Documentation kept in its rightful place.
From code to usage, the journey unfolds,
In every project, a story retold.

So here’s to MkDocs, a beacon, a guide,
In the world of documentation, you stand with pride.
With every line penned in your embrace,
You turn chaos to order, and give clarity space.


Bases: MkText

text property

```python
text: str
```

__init__

```python
__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)
```

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `user_prompt` | `str` | Main prompt for the LLM | *required* |
| `system_prompt` | `str \| None` | System prompt to set LLM behavior | `None` |
| `model` | `str` | LLM model identifier to use | `'openai:gpt-4o-mini'` |
| `context` | `str \| None` | Main context string | `None` |
| `extra_files` | `Sequence[str \| PathLike[str]] \| None` | Additional context files or strings | `None` |
| `kwargs` | `Any` | Keyword arguments passed to parent | `{}` |
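The `extra_files` parameter accepts file paths, directories, or literal strings, which the node resolves with a fallback chain (file contents → directory contents → literal string). A minimal, self-contained sketch of that behavior, using a hypothetical `read_context_items` helper built on `pathlib` rather than the library's own internals:

```python
from pathlib import Path
import tempfile


def read_context_items(items: list[str]) -> list[str]:
    """Mimic the file -> directory -> literal-string fallback."""
    results: list[str] = []
    for item in items:
        path = Path(item)
        if path.is_file():
            results.append(path.read_text())
        elif path.is_dir():
            # Recursively read every file inside the directory.
            results.extend(
                f.read_text() for f in sorted(path.rglob("*")) if f.is_file()
            )
        else:
            # Not a path on disk: treat the item as a literal context string.
            results.append(str(item))
    return results


with tempfile.TemporaryDirectory() as tmp:
    file_path = Path(tmp) / "notes.txt"
    file_path.write_text("file content")
    items = read_context_items([str(file_path), "plain string"])

print(items)  # ['file content', 'plain string']
```

This means a list can freely mix on-disk resources and inline snippets; anything that does not resolve to an existing path is passed through as context text.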
Base class:

| Name | Module | Description |
|---|---|---|
| `MkText` | `mknodes.basenodes.mktext` | Class for any Markup text. |
```mermaid
graph TD
  94599705988688["mkllm.MkLlm"]
  94599705611968["mktext.MkText"]
  94599705097232["mknode.MkNode"]
  94599703461184["node.Node"]
  140153667328480["builtins.object"]
  94599705611968 --> 94599705988688
  94599705097232 --> 94599705611968
  94599703461184 --> 94599705097232
  140153667328480 --> 94599703461184
```
/home/runner/work/mknodes/mknodes/mknodes/templatenodes/mkllm/metadata.toml

```toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
```
mknodes.templatenodes.mkllm.MkLlm
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        from upathtools import to_upath

        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = to_upath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )